Fast recovery from a union of subspaces

Chinmay Hegde, Piotr Indyk, Ludwig Schmidt

Neural Information Processing Systems

We address the problem of recovering a high-dimensional but structured vector from linear observations in a general setting where the vector can come from an arbitrary union of subspaces.
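Sparse vectors are the simplest instance of a union of subspaces: the k-sparse vectors in R^n form a union of (n choose k) coordinate subspaces, and projecting onto that union is just hard thresholding. As a hedged illustration of recovery over a subspace union (classic iterative hard thresholding, not the approximation-tolerant projections the paper develops; all parameters below are illustrative):

```python
import numpy as np

def iht(A, y, k, iters=500, step=None):
    """Iterative hard thresholding: recover a k-sparse x from y = A x.

    The thresholding step is an exact projection onto the union of
    k-dimensional coordinate subspaces.  (Illustrative only; the paper
    targets general unions via approximate projections.)
    """
    m, n = A.shape
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # conservative step size
    x = np.zeros(n)
    for _ in range(iters):
        x = x + step * A.T @ (y - A @ x)        # gradient step on ||y - Ax||^2
        keep = np.argsort(np.abs(x))[-k:]       # indices of k largest entries
        mask = np.zeros(n, dtype=bool)
        mask[keep] = True
        x[~mask] = 0.0                          # project onto the union
    return x

rng = np.random.default_rng(0)
m, n, k = 100, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x_hat = iht(A, A @ x_true, k)
print(np.linalg.norm(x_hat - x_true))           # small for this well-conditioned instance
```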



Author rebuttal (excerpt): "... our double over-parameterization approach for robust recovery problems to be novel and appreciate our theoretical ..."

Neural Information Processing Systems

We thank the reviewers for their detailed and thoughtful comments. All minor comments and corrections will be addressed in the final version. Below, we address each reviewer's comments in turn. Q1: Natural images may not have low-rank structures. A1: We did not model natural images by low-rank structures.


Discovery of Probabilistic Dirichlet-to-Neumann Maps on Graphs

Propp, Adrienne M., Actor, Jonas A., Walker, Elise, Owhadi, Houman, Trask, Nathaniel, Tartakovsky, Daniel M.

arXiv.org Machine Learning

Dirichlet-to-Neumann maps enable the coupling of multiphysics simulations across computational subdomains by ensuring continuity of state variables and fluxes at artificial interfaces. We present a novel method for learning Dirichlet-to-Neumann maps on graphs using Gaussian processes, specifically for problems where the data obey a conservation constraint from an underlying partial differential equation. Our approach combines discrete exterior calculus and nonlinear optimal recovery to infer relationships between vertex and edge values. This framework yields data-driven predictions with uncertainty quantification across the entire graph, even when observations are limited to a subset of vertices and edges. By optimizing over the reproducing kernel Hilbert space norm while applying a maximum likelihood estimation penalty on kernel complexity, our method ensures that the resulting surrogate strictly enforces conservation laws without overfitting. We demonstrate our method on two representative applications: subsurface fracture networks and arterial blood flow. Our results show that the method maintains high accuracy and well-calibrated uncertainty estimates even under severe data scarcity, highlighting its potential for scientific applications where limited data and reliable uncertainty quantification are critical.
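The conditioning mechanism behind such constrained Gaussian-process surrogates can be sketched in a few lines: a linear conservation constraint, treated as a noiseless observation, is enforced exactly by the posterior. This is only a generic illustration of Gaussian conditioning; the kernel, graph, and constraint below are made up, and the paper's discrete-exterior-calculus construction is not reproduced:

```python
import numpy as np

n = 6                                    # vertices of a small path graph
X = np.arange(n, dtype=float)
K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2)   # squared-exponential kernel

# Observations: vertex values at nodes 0 and 2, plus one *exact* linear
# constraint 1^T f = 0 standing in for a conservation (flux-balance) law.
H = np.vstack([np.eye(n)[[0, 2]], np.ones((1, n))])
y = np.array([1.0, -0.5, 0.0])

S = H @ K @ H.T + 1e-10 * np.eye(3)      # tiny jitter for numerical stability
mean = K @ H.T @ np.linalg.solve(S, y)   # posterior mean at every vertex
cov = K - K @ H.T @ np.linalg.solve(S, H @ K)   # posterior covariance (UQ)

print(mean.sum())                        # ~0: the constraint holds exactly
print(np.diag(cov))                      # pointwise uncertainty
```

The posterior interpolates the observed vertices and satisfies the constraint to numerical precision, while the covariance diagonal quantifies uncertainty at the unobserved vertices.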


Time--Data Tradeoffs by Aggressive Smoothing

John J. Bruer, Joel A. Tropp, Volkan Cevher, Stephen Becker

Neural Information Processing Systems

This paper proposes a tradeoff between sample complexity and computation time that applies to statistical estimators based on convex optimization. As the amount of data increases, we can smooth optimization problems more and more aggressively to achieve accurate estimates more quickly. This work provides theoretical and experimental evidence of this tradeoff for a class of regularized linear inverse problems.
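The mechanism can be shown with a hedged toy example: replacing an absolute-value penalty by its Moreau envelope (the Huber function) makes the gradient (1/mu)-Lipschitz, so a larger smoothing parameter mu permits a larger gradient step and fewer iterations, at the price of a bounded bias. The instance and parameters below are illustrative, not the estimators analyzed in the paper:

```python
import numpy as np

def huber_grad(x, mu):
    # gradient of the Huber function (Moreau envelope of |.| with parameter mu)
    return np.clip(x / mu, -1.0, 1.0)

def solve(A, b, lam, mu, tol=1e-6, max_iter=100000):
    """Gradient descent on 0.5||Ax-b||^2 + lam * sum huber_mu(x_i)."""
    L = np.linalg.norm(A, 2) ** 2 + lam / mu   # smoothness constant
    x = np.zeros(A.shape[1])
    for it in range(1, max_iter + 1):
        g = A.T @ (A @ x - b) + lam * huber_grad(x, mu)
        if np.linalg.norm(g) < tol:
            return x, it
        x -= g / L                              # step size 1/L
    return x, max_iter

rng = np.random.default_rng(1)
A = rng.standard_normal((60, 30)) / np.sqrt(60)
b = A @ rng.standard_normal(30)
_, it_rough = solve(A, b, lam=0.1, mu=1e-3)    # barely smoothed
_, it_smooth = solve(A, b, lam=0.1, mu=1e-1)   # aggressively smoothed
print(it_smooth, it_rough)                     # more smoothing, fewer iterations
```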


Accelerating Ill-conditioned Hankel Matrix Recovery via Structured Newton-like Descent

Cai, HanQin, Huang, Longxiu, Lu, Xiliang, You, Juntao

arXiv.org Machine Learning

This paper studies the robust Hankel recovery problem, which simultaneously removes sparse outliers and fills in missing entries from partial observations. We propose a novel non-convex algorithm, coined Hankel Structured Newton-Like Descent (HSNLD), to tackle the robust Hankel recovery problem. HSNLD is highly efficient with linear convergence, and its convergence rate is independent of the condition number of the underlying Hankel matrix. A recovery guarantee is established under mild conditions. Numerical experiments on both synthetic and real datasets show the superior performance of HSNLD against state-of-the-art algorithms.
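To make the setup concrete, here is a hedged sketch of a generic alternating heuristic for robust Hankel recovery (flag the largest residuals as outliers, then re-impose low rank and Hankel structure by anti-diagonal averaging). This is not HSNLD itself, which uses a structured Newton-like descent; the signal, rank, and outlier count below are illustrative and assumed known:

```python
import numpy as np

def hankel_of(x, p):
    """p x (n-p+1) Hankel matrix built from the signal x."""
    n = len(x)
    return np.array([x[i:i + n - p + 1] for i in range(p)])

def antidiag_average(H):
    """Project a matrix back to a signal by averaging anti-diagonals."""
    p, q = H.shape
    x = np.zeros(p + q - 1)
    cnt = np.zeros(p + q - 1)
    for i in range(p):
        for j in range(q):
            x[i + j] += H[i, j]
            cnt[i + j] += 1
    return x / cnt

n, p, r, n_out = 64, 32, 4, 4
t = np.arange(n)
x_true = np.cos(0.4 * t) + 0.5 * np.sin(1.1 * t)   # two sinusoids: Hankel rank 4

rng = np.random.default_rng(2)
s = np.zeros(n)
s[rng.choice(n, n_out, replace=False)] = 5.0 * rng.standard_normal(n_out)
y = x_true + s                                     # sparsely corrupted observation

x = y.copy()
for _ in range(50):
    resid = y - x                                  # current residual
    thresh = np.sort(np.abs(resid))[-(n_out + 1)]  # keep the n_out largest
    s_hat = resid * (np.abs(resid) > thresh)       # estimated outliers
    U, sv, Vt = np.linalg.svd(hankel_of(y - s_hat, p), full_matrices=False)
    x = antidiag_average((U[:, :r] * sv[:r]) @ Vt[:r])  # low rank + Hankel

print(np.max(np.abs(x - x_true)))                  # recovery error
```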


Fast White-Box Adversarial Streaming Without a Random Oracle

Feng, Ying, Jain, Aayush, Woodruff, David P.

arXiv.org Artificial Intelligence

Recently, the question of adversarially robust streaming, where the stream is allowed to depend on the randomness of the streaming algorithm, has received considerable attention. In this work, we consider a strong white-box adversarial model (Ajtai et al., PODS 2022), in which the adversary has access to all past random coins and the parameters used by the streaming algorithm. We focus on the sparse recovery problem and extend our results to other tasks such as distinct element estimation and low-rank approximation of matrices and tensors. The main drawback of previous work is that it requires a random oracle, which is especially problematic in the streaming model since the amount of randomness is counted in the space complexity of a streaming algorithm; previous work also suffers from a large per-item update time. We construct a near-optimal solution for the sparse recovery problem in white-box adversarial streams, based on the subexponentially secure Learning with Errors (LWE) assumption. Importantly, our solution does not require a random oracle and has polylogarithmic per-item processing time. We also give results in a related white-box adversarially robust distributed model. Our constructions are based on homomorphic encryption schemes satisfying very mild structural properties that are currently satisfied by most known schemes.
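The core (non-adversarial) sparse-recovery primitive in streaming is simple to state: a 1-sparse vector is recoverable from two linear measurements maintained over the stream. The hedged sketch below shows only this classical building block; the paper's contribution, protecting such sketches against a white-box adversary with LWE-based homomorphic encryption, is not reproduced here:

```python
def recover_1sparse(updates):
    """Recover the single nonzero coordinate of a vector given only a
    stream of (index, delta) updates, assuming the final vector is
    1-sparse.  Two linear measurements of the stream suffice."""
    m0 = 0                               # sum of all entries
    m1 = 0                               # index-weighted sum
    for i, delta in updates:
        m0 += delta
        m1 += i * delta
    if m0 == 0:
        return None                      # zero vector (or not 1-sparse)
    assert m1 % m0 == 0, "final vector is not 1-sparse"
    return (m1 // m0, m0)                # (index, value)

stream = [(3, 2), (7, 5), (3, 4), (7, -5)]   # net vector: value 6 at index 3
print(recover_1sparse(stream))               # -> (3, 6)
```

Because both measurements are linear in the stream, they can be maintained under arbitrary insertions and deletions in O(1) space each; the difficulty the paper addresses is that an adversary who sees the algorithm's internal state can craft updates that break randomized generalizations of this idea.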



Exact and Stable Recovery of Pairwise Interaction Tensors

Neural Information Processing Systems

Tensor completion from incomplete observations is a problem of significant practical interest. However, it is unlikely that there exists an efficient algorithm with provable guarantees for recovering a general tensor from a limited number of observations. In this paper, we study recovery algorithms for pairwise interaction tensors, which have recently gained considerable attention for modeling multi-attribute data due to their simplicity and effectiveness.
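For concreteness, a pairwise interaction tensor (in the style of Rendle's PITF model) expresses each entry as a sum of three pairwise inner products, so an n x n x n tensor is parameterized by O(nr) numbers instead of n^3. The sketch below is illustrative; the factor names are made up:

```python
import numpy as np

rng = np.random.default_rng(3)
n, r = 8, 3
A, B = rng.standard_normal((n, r)), rng.standard_normal((n, r))  # (i, j) factors
C, D = rng.standard_normal((n, r)), rng.standard_normal((n, r))  # (j, k) factors
E, F = rng.standard_normal((n, r)), rng.standard_normal((n, r))  # (k, i) factors

# T[i, j, k] = <a_i, b_j> + <c_j, d_k> + <e_k, f_i>
T = (np.einsum('ir,jr->ij', A, B)[:, :, None]       # <a_i, b_j>, broadcast over k
     + np.einsum('jr,kr->jk', C, D)[None, :, :]     # <c_j, d_k>, broadcast over i
     + np.einsum('kr,ir->ki', E, F).T[:, None, :])  # <e_k, f_i>, broadcast over j

# spot-check one entry against the definition
i, j, k = 1, 4, 6
val = A[i] @ B[j] + C[j] @ D[k] + E[k] @ F[i]
print(np.isclose(T[i, j, k], val))
```

This additive pairwise structure is what makes exact and stable recovery from few observations tractable, in contrast to general low-rank tensor completion.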